Documentation for Users  1.0.0
Perception Toolbox for Virtual Reality (PTVR) Manual
The MNREAD test in virtual reality


This example illustrates PTVR's capability to display text at a given x-height. It relates to the study of reading performance, which provides essential information about subjects, especially low-vision persons, in our context. Among the various existing tests, the MNREAD test [1] is probably one of the most widely used standardized reading tests globally for measuring reading performance in clinical and research contexts. It was created by Gordon Legge (University of Minnesota), the founding father of the field of the psychophysics of reading and a world-renowned expert in low vision. The printed version is the standard; a tablet (iPad) version has recently been developed and validated [2] to facilitate the distribution and use of the test.

Here we propose a PTVR version using the original MNREAD sentences in English. The raw data on a subject's reading performance (reading speed and errors for each sentence) is displayed in an operator interface and saved in an output file, allowing further analysis to extract several indicators of reading performance [3]. Performing the MNREAD test in virtual reality offers four main advantages: (1) It allows very large print sizes, unconstrained by any physical screen size, since the display is a 360-degree environment. (2) It gives full control over the experimental conditions, such as the luminance of the environment. (3) It provides the operator with a precise measurement of the distance between the patient's head and the text, which is important information to monitor during a reading test. (4) It opens the way to new behavioral studies thanks to eye-tracking recording during the test. Our hope is that this key reading test will be easily used anywhere and by any reading researcher.
Thus, easily performing this standardized test in a well-controlled VR environment should help produce a large amount of reproducible data from multiple sites.
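As a concrete illustration of the two quantities mentioned above, the sketch below converts an angular print size (in logMAR) and a viewing distance into the physical x-height to render, and computes a per-sentence reading speed from the reading time and errors. The function names are illustrative, not part of the PTVR API; the sketch assumes the usual conventions that 0.0 logMAR corresponds to an x-height subtending 5 arcmin, and that each MNREAD sentence contains 60 characters, i.e. 10 standard-length words.

```python
import math

def xheight_m(logmar: float, distance_m: float) -> float:
    """Physical x-height (meters) so that lowercase text subtends the
    angular print size `logmar` at viewing distance `distance_m`.
    Assumes the convention that 0.0 logMAR = 5 arcmin of x-height."""
    arcmin = 5.0 * (10.0 ** logmar)        # angular x-height in arcmin
    theta = math.radians(arcmin / 60.0)    # same angle in radians
    return 2.0 * distance_m * math.tan(theta / 2.0)

def reading_speed_wpm(errors: int, time_s: float) -> float:
    """Corrected reading speed for one MNREAD sentence, assuming the
    standard scoring: each sentence = 10 standard-length words, so
    wpm = 60 * (10 - errors) / reading time in seconds."""
    return 60.0 * (10 - errors) / time_s

# Example: a 0.4 logMAR sentence viewed at 40 cm, read in 3 s with 1 error
h = xheight_m(0.4, 0.40)          # physical x-height to render, in meters
wpm = reading_speed_wpm(1, 3.0)   # corrected reading speed
```

Because the text is rendered in a 360-degree virtual environment, `xheight_m` can be evaluated for arbitrarily large logMAR values without hitting a screen-size limit, and the head-to-text distance reported by the headset can be fed back into it to keep the angular print size constant.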

Video | Code: mnread_vr_experiment.py | For more information: The MNREAD test in virtual reality

Bibliography

[1] J. S. Mansfield, S. J. Ahn, G. E. Legge, & A. Luebker (1993). A new reading-acuity chart for normal and low vision. Ophthalmic and Visual Optics/Noninvasive Assessment of the Visual System Technical Digest, Optical Society of America, pp. 232–235.

[2] A. Calabrèse, L. To, Y. He, E. Berkholtz, P. Rafian, & G. E. Legge (2018). Comparing performance on the MNREAD iPad application with the MNREAD acuity chart. Journal of Vision, 18(1), 8. doi:10.1167/18.1.8

[3] K. Baskaran, A. F. Macedo, Y. He, L. Hernandez-Moreno, T. Queirós, J. S. Mansfield, & A. Calabrèse (2019). Scoring reading parameters: An inter-rater reliability study using the MNREAD chart. PLOS ONE. https://doi.org/10.1371/journal.pone.0216775